# Hierarchical Parameter Allocation

## OpenELM-1_1B

OpenELM is a series of efficient language models from Apple that uses a layer-wise ("hierarchical") scaling strategy to optimize parameter allocation, offering pretrained and instruction-tuned models ranging from 270M to 3B parameters.

Large Language Model · Transformers · apple · 683 downloads · 31 likes
## OpenELM-450M-Instruct

OpenELM is a set of open-source efficient language models that employ a hierarchical scaling strategy to optimize parameter allocation, including pretrained and instruction-tuned versions ranging from 270 million to 3 billion parameters.

Large Language Model · Transformers · apple · 114.41k downloads · 47 likes
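These checkpoints are published on the Hugging Face hub under the `apple` organization. A minimal loading sketch for the instruct variant is below; OpenELM ships as custom modeling code, so `trust_remote_code=True` is required, and the model card points to the Llama 2 tokenizer (a gated repo), so treat that tokenizer ID as something you may need to swap for a compatible alternative.

```python
# Minimal sketch: load apple/OpenELM-450M-Instruct with transformers.
# trust_remote_code=True is required because OpenELM uses custom model code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "apple/OpenELM-450M-Instruct",
    trust_remote_code=True,
)
# OpenELM reuses the Llama 2 tokenizer per its model card; the repo is gated,
# so substitute any compatible tokenizer if you lack access.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

inputs = tokenizer("Once upon a time there was", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```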
## OpenELM-3B

OpenELM is a set of open-source efficient language models that employ a hierarchical scaling strategy to optimize parameter allocation and improve accuracy. It comes in four parameter scales, 270M, 450M, 1.1B, and 3B, each with pretrained and instruction-tuned versions.

Large Language Model · Transformers · apple · 1,436 downloads · 123 likes
## OpenELM-450M

OpenELM is a set of open, efficient language models that employ a hierarchical scaling strategy to optimize parameter allocation and improve model accuracy, with pretrained and instruction-tuned versions ranging from 270 million to 3 billion parameters.

Large Language Model · Transformers · apple · 857 downloads · 26 likes
## OpenELM-270M

OpenELM is a set of open-source efficient language models that adopt a hierarchical scaling strategy to allocate parameters efficiently across each layer of the Transformer, improving accuracy.

Large Language Model · Transformers · apple · 4,719 downloads · 74 likes